Spatial Audio with the W3C Architecture for Multimodal Interfaces
Authors
Abstract
The development of multimodal applications is still hampered by the need to integrate various technologies and frameworks into a coherent application. In 2012, the W3C proposed a multimodal architecture that standardizes the overall structure of a multimodal application and the events passed between its constituent components. In this paper, we present our experiences with implementing a multimodal application employing spatial audio, text-to-speech and XHTML.
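To make the standardized message exchange concrete, the sketch below shows an interaction manager posting an MMI lifecycle StartRequest to a modality component over HTTP. The endpoint URL, the component identifiers and the use of Java's built-in HTTP client are illustrative assumptions, not details taken from the paper; only the event name, its attributes and the MMI namespace come from the W3C MMI Architecture specification.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Minimal sketch of an interaction manager sending a W3C MMI lifecycle event
 * (a StartRequest) to a modality component over HTTP. The endpoint URL and
 * the identifiers are illustrative assumptions, not taken from the paper.
 */
public class MmiStartRequestExample {

    // MMI lifecycle events share this namespace (W3C MMI Architecture and Interfaces).
    private static final String MMI_NS = "http://www.w3.org/2008/04/mmi-arch";

    public static void main(String[] args) throws Exception {
        // A StartRequest asks a modality component (e.g. a TTS engine) to begin
        // processing within an existing interaction context. In practice the
        // request would also carry content, e.g. via Content or ContentURL.
        String startRequest =
            "<mmi:mmi xmlns:mmi=\"" + MMI_NS + "\" version=\"1.0\">\n" +
            "  <mmi:StartRequest Context=\"ctx-1\" Source=\"im-1\"\n" +
            "                    Target=\"tts-component\" RequestID=\"req-1\"/>\n" +
            "</mmi:mmi>";

        // Hypothetical HTTP endpoint of the modality component.
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8080/mmi"))
            .header("Content-Type", "application/xml")
            .POST(HttpRequest.BodyPublishers.ofString(startRequest))
            .build();

        // The component is expected to answer with a StartResponse and later a
        // DoneNotification carrying the same Context and RequestID.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("StartResponse payload: " + response.body());
    }
}
```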
Similar resources
Kinesthetic Input Modalities for the W3C Multimodal Architecture (Workshop on W3C's Multimodal Architecture and Interfaces)
Deutsche Telekom Laboratories and T-Systems recently developed various multimodal prototype applications and modules allowing kinesthetic input, i.e. devices that can be moved around in order to alter the applications’ state. The first prototypes were built using proprietary technologies and principles, whereas the latest demonstrators increasingly follow the W3C’s Multimodal Architecture. This...
Using SCXML to Integrate Semantic Sensor Information into Context-aware User Interfaces
This paper describes a novel architecture for introducing automatic annotation and processing of semantic sensor data within context-aware applications. Based on well-known state-chart technologies, and represented using the W3C SCXML language combined with Semantic Web technologies, our architecture is able to provide enriched higher-level semantic representations of the user’s context. This capabil...
A Prolog Datamodel for State Chart XML
SCXML was proposed as a description language for dialog control in the W3C Multimodal Architecture, but it lacks the facilities required for grounding and reasoning. This prohibits the application of many dialog modeling techniques to multimodal applications following this W3C standard. By extending SCXML with a Prolog datamodel and scripting language, we enable those techniques to be employed a...
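As an illustration of where such an extension hooks into SCXML, the sketch below embeds a small state chart whose datamodel attribute selects a hypothetical "prolog" evaluator and whose guard condition is written as a Prolog-style query. The attribute value "prolog" and the guard syntax are assumptions for illustration, not the notation used in the cited paper; SCXML's standard values for this attribute are e.g. "null" and "ecmascript".

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

/**
 * Illustrative sketch only: SCXML selects its expression language via the
 * "datamodel" attribute. The "prolog" value and the guard syntax below are
 * hypothetical and merely show where a Prolog datamodel would plug in.
 */
public class PrologDatamodelSketch {

    private static final String SCXML =
        "<scxml xmlns=\"http://www.w3.org/2005/07/scxml\" version=\"1.0\"\n" +
        "       datamodel=\"prolog\" initial=\"idle\">\n" +
        "  <state id=\"idle\">\n" +
        "    <!-- Hypothetical guard: a Prolog query over grounded dialog facts -->\n" +
        "    <transition event=\"user.utterance\" cond=\"grounded(topic)\" target=\"answer\"/>\n" +
        "  </state>\n" +
        "  <state id=\"answer\"/>\n" +
        "</scxml>";

    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);
        Document doc = factory.newDocumentBuilder()
            .parse(new InputSource(new StringReader(SCXML)));
        // An SCXML processor would dispatch on this attribute to pick the
        // evaluator used for cond/expr attributes and <script> content.
        System.out.println("datamodel = "
            + doc.getDocumentElement().getAttribute("datamodel"));
    }
}
```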
Designing and Executing Multimodal Interfaces for the Web based on State Chart XML
The design and implementation of multimodal interfaces that support a variety of modes to enable natural interaction are still limited. We propose a multimodal interaction web framework that, on the one hand, follows current W3C standardization activities, such as State Chart XML and the Model-Based UI Working Group, and, on the other hand, implements recent research results enabling the di...
Dirk Schnelle-Walka, Technische Universität Darmstadt
My research interests lie in the area of Collaborative User Interfaces for Smart Spaces. Generally, these interfaces are multimodal, with a special focus on voice as one of the major modalities. In my view, these interfaces should offer a good user experience on the one hand and a good developer experience on the other. Dialog systems that are easy to implement most likely rely on a finite state mach...